A Crowdsourcing Method for Obtaining Rephrased Questions

Authors

  • Nobuyuki Shimizu
  • Atsuyuki Morishima
  • Ryota Hayashi
Abstract

We propose a method for obtaining and ranking paraphrased questions from crowds to be used as a part of instructions in microtask-based crowdsourcing. With our method, we are able to obtain questions that differ in expression yet have the same semantics with respect to the crowdsourcing task. This is done by generating tasks that give hints and elicit instructions from workers. We conducted experiments with data used for a real set of gold standard questions submitted to a commercial crowdsourcing platform and compared the results with those from a direct-rewrite method.


Similar Papers

Crowdsourcing Multiple Choice Science Questions

We present a novel method for obtaining high-quality, domain-targeted multiple choice questions from crowd workers. Generating these questions can be difficult without trading away originality, relevance or diversity in the answer options. Our method addresses these problems by leveraging a large corpus of domain-specific text and a small set of existing questions. It produces model suggestions...


Perform Three Data Mining Tasks with Crowdsourcing Process

In data mining studies, the feature selection process is often too complex to carry out by hand, so some labeling tasks must be sent to workers through crowdsourcing. The outsourcing of data mining tasks to users is typically handled by software systems that lack sufficient knowledge of the users' age or geographic location. Uncertainty about the performance of virtual user...


Generating Gold Questions for Difficult Visual Recognition Tasks

Gold questions are a standard mechanism to detect insincere workers on crowdsourcing platforms. They usually rely on the assumption that workers should obtain perfect accuracy on the task. In this work, we are interested in crowdsourcing difficult multi-class visual recognition tasks, for which this assumption is not met, and we propose a novel method for generating gold questions in this context.


Viability of Crowd-Volunteered Open Research Reviews

In this study, we examine the feasibility of reviewing research publications through crowdsourcing. We propose a post-publication review model where the research materials are posted online for the crowd volunteers to review. One such platform, named PuGLi, is prepared and the data collected from it is analyzed. The analysis raises some interesting questions related to the challenges of attract...


Perspectives on Crowdsourcing Annotations for Natural Language Processing

Crowdsourcing has emerged as a new method for obtaining annotations for training models for machine learning. While many variants of this process exist, they largely differ in their method of motivating subjects to contribute and the scale of their applications. To date, however, there has yet to be a study that helps a practitioner to decide what form an annotation application should take to b...




Publication date: 2015